false negative

Terms from Artificial Intelligence: humans at the heart of algorithms

Page numbers are for the draft copy at present; they will be replaced with correct numbers when the final book is formatted. Chapter numbers are correct and will not change now.

A false negative occurs when a classification or decision system gives a negative result when it should really be positive; for example, when a medical diagnosis system says someone is well when they actually have the ailment. This is in contrast with a true negative, a false positive and a true positive.
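A minimal Python sketch of how the four outcomes can be counted, using made-up labels and predictions purely for illustration. Here 1 stands for "has the ailment" (positive) and 0 for "well" (negative), so a false negative is a case where the true label is 1 but the system outputs 0.

```python
# Hypothetical ground truth and system output: 1 = has the ailment, 0 = well.
true_labels = [1, 0, 1, 1, 0, 0, 1, 0]
predictions = [1, 0, 0, 1, 0, 1, 0, 0]

counts = {"true_positive": 0, "false_positive": 0,
          "true_negative": 0, "false_negative": 0}

for actual, predicted in zip(true_labels, predictions):
    if actual == 1 and predicted == 1:
        counts["true_positive"] += 1      # correctly flagged as ill
    elif actual == 0 and predicted == 1:
        counts["false_positive"] += 1     # flagged as ill but actually well
    elif actual == 0 and predicted == 0:
        counts["true_negative"] += 1      # correctly reported as well
    else:
        counts["false_negative"] += 1     # reported as well but actually ill

print(counts)
# {'true_positive': 2, 'false_positive': 1, 'true_negative': 3, 'false_negative': 2}
```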

Used in Chap. 9: pages 181, 196; Chap. 18: pages 450, 451; Chap. 19: page 473; Chap. 20: page 506